EM Algorithm and Mixtures. 1.1 Introduction
Abstract
The Expectation-Maximization (EM) algorithm is a broadly applicable iterative statistical technique for maximizing complex likelihoods and handling incomplete-data problems. Each iteration performs two steps: (i) the E-step, which projects an appropriate functional of the augmented data onto the space of the original, incomplete data, and (ii) the M-step, which maximizes that functional. The name EM algorithm was coined by Dempster, Laird, and Rubin in their fundamental paper [1], often referred to as the DLR paper. The underlying idea, however, has a long history. The EM algorithm can be viewed as a forerunner of MCMC: it shares the data-augmentation step, with simulation replaced by maximization. Newcomb [7] was already interested in estimating mixtures of normals in 1886, and McKendrick [5] and Healy and Westmacott [3] proposed iterative methods that are, in fact, examples of the EM algorithm. Dozens of papers proposing various applications of EM appeared before the DLR paper in 1977. The DLR paper, however, was the first to unify and organize the approach.
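The E-step/M-step iteration described above can be sketched for the classical setting mentioned in the abstract, a mixture of normals. The following is a minimal illustrative implementation for a two-component one-dimensional Gaussian mixture (the function name, initialization scheme, and fixed iteration count are choices made for this sketch, not taken from the article):

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """Sketch of EM for a two-component 1-D Gaussian mixture.

    Illustrative only: min/max initialization and a fixed number of
    iterations are simplifications; real code would monitor the
    log-likelihood for convergence.
    """
    # Initialize weights, means, and variances.
    w = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood updates of all parameters.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Usage: data drawn from a known two-component mixture.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 1.0, 500)])
w, mu, var = em_gmm_1d(x)
```

On well-separated data like this, the recovered means land close to the true values -2 and 3, and the mixing weights close to 0.5 each.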
Similar Resources
Approximation of Unknown Multivariate Probability Distributions by Using Mixtures of Product Components: A Tutorial
In the literature, references to EM estimation of product mixtures are not very frequent. The simplifying assumption of product components, e.g., diagonal covariance matrices in the case of Gaussian mixtures, is usually considered only as a compromise forced by computational constraints or a limited dataset. We have found that product mixtures are rarely used intentionally as a preferable app...
An Application of the EM-algorithm to Approximate Empirical Distributions of Financial Indices with the Gaussian Mixtures
In this study I briefly illustrate the application of Gaussian mixtures to approximate empirical distributions of financial indices (DAX, Dow Jones, Nikkei, RTSI, S&P 500). The resulting distributions show a very high quality of approximation, as evaluated by the Kolmogorov-Smirnov test. This motivates further study of the application of Gaussian mixtures to approximate empirical distributions of ...
EM for Spherical Gaussians
In this project, we examine two aspects of the behavior of the EM algorithm for mixtures of spherical Gaussians: 1) the benefit of spectral projection for such mixtures, and 2) the general behavior of the EM algorithm under certain separability criteria. Our current results are for mixtures of two Gaussians, although these can be extended. In the case of 1), we show that the value of the Q func...
Ten Steps of EM Suffice for Mixtures of Two Gaussians
We provide global convergence guarantees for the expectation-maximization (EM) algorithm applied to mixtures of two Gaussians with known covariance matrices. We show that EM converges geometrically to the correct mean vectors, and provide simple, closed-form expressions for the convergence rate. As a simple illustration, we show that in one dimension ten steps of the EM algorithm initialized at...
Choosing initial values for the EM algorithm for finite mixtures
The EM algorithm is the standard tool for maximum likelihood estimation in finite mixture models. The main drawbacks of the EM algorithm are its slow convergence and the dependence of the solution on both the stopping criterion and the initial values used. The problems of slow convergence and the choice of a stopping criterion have been dealt with in the literature, and the present paper de...
Publication date: 2005